Graph Meta-Learning



Learning to Propagate for Graph Meta-Learning

Liu, Lu, Zhou, Tianyi, Long, Guodong, Jiang, Jing, Zhang, Chengqi

Neural Information Processing Systems

Meta-learning extracts the common knowledge from learning different tasks and uses it for unseen tasks. It can significantly improve tasks that suffer from insufficient training data, e.g., few-shot learning. In most meta-learning methods, tasks are implicitly related by sharing parameters or an optimizer. In this paper, we show that a meta-learner that explicitly relates tasks on a graph describing the relations of their output dimensions (e.g., classes) can significantly improve few-shot learning. The graph's structure is usually free or cheap to obtain but has rarely been explored in previous works. We develop a novel meta-learner of this type for prototype-based classification, in which a prototype is generated for each class, such that nearest-neighbor search among the prototypes produces an accurate classification. The meta-learner, called "Gated Propagation Network (GPN)", learns to propagate messages between prototypes of different classes on the graph, so that learning the prototype of each class benefits from the data of other related classes.
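The core idea of gated prototype propagation can be sketched in a few lines. The snippet below is a minimal illustration only: the tensor shapes, the per-dimension sigmoid gate, and the row-normalized adjacency are simplifying assumptions, not the paper's exact parameterization (GPN learns the gates and attention weights end-to-end).

```python
import numpy as np

def gated_propagation(prototypes, adj, gate_weight, steps=1):
    """Gated message passing between class prototypes on a class graph.

    prototypes:  (C, d) array, one row per class prototype
    adj:         (C, C) row-normalized adjacency of the class graph
    gate_weight: (d,) illustrative parameter producing a per-dimension gate
    """
    p = prototypes
    for _ in range(steps):
        msg = adj @ p                                     # aggregate neighbor prototypes
        gate = 1.0 / (1.0 + np.exp(-(p * gate_weight)))   # sigmoid gate in (0, 1)
        p = gate * p + (1.0 - gate) * msg                 # blend own prototype with messages
    return p

def classify(query, prototypes):
    """Nearest-prototype classification by Euclidean distance."""
    dists = np.linalg.norm(prototypes - query, axis=1)
    return int(np.argmin(dists))
```

After propagation, each refined prototype mixes information from related classes, which is what lets a data-poor class borrow statistical strength from its neighbors on the graph.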



Reviews: Learning to Propagate for Graph Meta-Learning

Neural Information Processing Systems

The prototype embeddings are refined iteratively from the current representation and the neighboring prototypes of similar tasks using a gating mechanism. The refinement process is similar to the multi-head mechanism but remains novel in this setting. The evaluation on ImageNet is reasonable and demonstrates the effectiveness of the method on both closely related and distant tasks. However, as the data were created by the authors and not released, it may be hard to reproduce the results from the information in the current submission. In Line 173, the authors mention that it is possible to use the historical prototypes to improve training performance of the GPN model.


Reviews: Learning to Propagate for Graph Meta-Learning

Neural Information Processing Systems

The reviewers agree that the proposed GPN is a novel combination of several components from the literature and represents a good contribution to the meta-learning community. Please be sure to include a notation table as requested by one reviewer, along with the additional explanations/clarifications provided in the rebuttal.



Unsupervised Episode Generation for Graph Meta-learning

Jung, Jihyeong, Seo, Sangwoo, Kim, Sungwon, Park, Chanyoung

arXiv.org Artificial Intelligence

We investigate unsupervised episode generation methods to solve the Few-Shot Node Classification (FSNC) task via meta-learning without labels. Dominant meta-learning methodologies for FSNC assume abundant labeled nodes from diverse base classes for training, which may not be obtainable in the real world. Although a few studies have tried to tackle the label-scarcity problem in graph meta-learning, they still rely on a few labeled nodes, which hinders full utilization of the information of all nodes in a graph. Despite the effectiveness of graph contrastive learning (GCL) methods on the FSNC task without using label information, they mainly learn generic node embeddings without consideration of the downstream task to be solved, which may limit their performance on the FSNC task. To this end, we propose a simple yet effective unsupervised episode generation method that benefits from the generalization ability of meta-learning for the FSNC task while resolving the label-scarcity problem. Our proposed method, called Neighbors as Queries (NaQ), generates training episodes based on pre-calculated node-node similarity. Moreover, NaQ is model-agnostic; hence, it can be used to train any existing supervised graph meta-learning method in an unsupervised manner, without sacrificing much of its performance and sometimes even improving it. Extensive experimental results demonstrate the potential of our unsupervised episode generation methods for graph meta-learning on the FSNC task. Our code is available at: https://github.com/JhngJng/NaQ-PyTorch
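Similarity-based episode generation of this flavor is easy to sketch. The snippet below is a hedged illustration, not NaQ's actual implementation: treating each sampled node as a 1-shot pseudo-class and using cosine similarity over raw node features are assumptions made here for brevity (the paper pre-calculates similarities its own way).

```python
import numpy as np

def similarity_episode(features, n_way, n_query, seed=None):
    """Generate one unsupervised training episode from node-node similarity.

    Sample n_way nodes as 1-shot supports (each its own pseudo-class), then
    take each support node's top-n_query most similar nodes as its queries.

    features: (N, d) array of node features or embeddings
    Returns (support_ids, {support_id: [query_ids]}).
    """
    rng = np.random.default_rng(seed)
    unit = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = unit @ unit.T                    # cosine similarity between all node pairs
    np.fill_diagonal(sim, -np.inf)         # a node never queries itself
    support = rng.choice(len(features), size=n_way, replace=False)
    queries = {int(s): np.argsort(sim[s])[::-1][:n_query].tolist()
               for s in support}
    return support.tolist(), queries
```

Because the episode is built purely from similarities, it can feed any episodic meta-learner that expects (support, query) pairs, which is the sense in which such a generator is model-agnostic.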


Robust Graph Meta-learning for Weakly-supervised Few-shot Node Classification

Ding, Kaize, Wang, Jianling, Li, Jundong, Caverlee, James, Liu, Huan

arXiv.org Artificial Intelligence

Graphs are widely used to model the relational structure of data, and research on graph machine learning (ML) spans a wide spectrum of applications, from drug design on molecular graphs to friendship recommendation in social networks. Prevailing approaches for graph ML typically require abundant labeled instances to achieve satisfactory results, which is commonly infeasible in real-world scenarios since labeled data for newly emerged concepts (e.g., new categorizations of nodes) on graphs is limited. Though meta-learning has been applied to different few-shot graph learning problems, most existing efforts predominantly assume that all the data from the seen classes is gold-labeled, and those methods may lose their efficacy when the seen data is weakly labeled with severe label noise. As such, we investigate a novel problem of weakly-supervised graph meta-learning for improving model robustness in terms of knowledge transfer. To achieve this goal, we propose a new graph meta-learning framework -- Graph Hallucination Networks (Meta-GHN). Based on a new robustness-enhanced episodic training scheme, Meta-GHN is meta-learned to hallucinate clean node representations from weakly-labeled data and extracts highly transferable meta-knowledge, which enables the model to quickly adapt to unseen tasks with few labeled instances. Extensive experiments demonstrate the superiority of Meta-GHN over existing graph meta-learning methods on the task of weakly-supervised few-shot node classification.
